Metric Selection in Douglas-Rachford Splitting and ADMM
Authors
Abstract
Recently, several convergence rate results for Douglas-Rachford splitting and the alternating direction method of multipliers (ADMM) have been presented in the literature. In this paper, we show linear convergence of Douglas-Rachford splitting and ADMM under certain assumptions. We also show that the provided bounds on the linear convergence rates generalize and/or improve on similar bounds in the literature. Further, we show how to select the algorithm parameter to optimize the provided linear convergence rate bound. For smooth and strongly convex finite dimensional problems, we show how the linear convergence rate bounds depend on the metric that is used in the algorithm, and we show how to select this metric to optimize the bound. Since most real-world problems are not both smooth and strongly convex, we also propose heuristic metric and parameter selection methods to improve performance on a much wider class of problems that do not satisfy both of these assumptions. These heuristic methods can be applied to problems arising, e.g., in compressed sensing, statistical estimation, model predictive control, and medical imaging. The efficiency of the proposed heuristics is confirmed in a numerical example on a model predictive control problem, where improvements of more than one order of magnitude are observed.
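To make the iteration discussed in the abstract concrete, here is a minimal sketch of Douglas-Rachford splitting on a toy problem. This is an illustrative example only, not the paper's metric-selection method: the functions f and g, the step size gamma, and the relaxation parameter lam are all hypothetical choices.

```python
# Douglas-Rachford splitting for minimize f(x) + g(x), sketched on a toy
# separable problem (not the paper's method):
#   f(x) = 0.5*||x - a||^2          -> prox is a simple shrink toward a
#   g(x) = indicator of {x >= 0}    -> prox is projection onto the orthant

def prox_f(z, a, gamma):
    # prox of gamma*f at z for f(x) = 0.5*(x - a)^2, componentwise
    return [(zi + gamma * ai) / (1.0 + gamma) for zi, ai in zip(z, a)]

def prox_g(z):
    # projection onto x >= 0
    return [max(zi, 0.0) for zi in z]

def douglas_rachford(a, gamma=1.0, lam=1.0, iters=200):
    # gamma (step size) and lam (relaxation) are illustrative values;
    # choosing them well is exactly what the paper's bounds address.
    z = [0.0] * len(a)
    for _ in range(iters):
        x = prox_f(z, a, gamma)
        y = prox_g([2 * xi - zi for xi, zi in zip(x, z)])
        z = [zi + lam * (yi - xi) for zi, xi, yi in zip(z, x, y)]
    return prox_f(z, a, gamma)

# Minimizer of 0.5*||x - a||^2 over x >= 0 is max(a, 0) componentwise.
x = douglas_rachford([1.5, -2.0, 0.3])
```

Since f here is both smooth and strongly convex, the iteration converges linearly, which is the regime the paper's rate bounds cover; a metric (preconditioner) would enter by rescaling the space in which the two prox operators are evaluated.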
Related works
A New Use of Douglas-Rachford Splitting and ADMM for Identifying Infeasible, Unbounded, and Pathological Conic Programs
In this paper, we present a method for identifying infeasible, unbounded, and pathological conic programs based on Douglas-Rachford splitting, or equivalently ADMM. When an optimization program is infeasible, unbounded, or pathological, the iterates of Douglas-Rachford splitting diverge. Somewhat surprisingly, such divergent iterates still provide useful information, which our method uses for i...
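The key observation described above (diverging iterates still carry information) can be sketched on a one-dimensional toy problem. The two sets below are hypothetical choices for illustration; this is not the cited paper's full identification method.

```python
# Infeasibility detection via Douglas-Rachford, toy 1-D sketch:
#   minimize ind_{x <= -1}(x) + ind_{x >= 1}(x)
# The two constraint sets do not intersect, so the problem is infeasible.
# The iterates z_k diverge, but the differences z_{k+1} - z_k settle to a
# nonzero value (here, the gap between the sets), certifying infeasibility.

def prox_f(z):  # projection onto {x <= -1}
    return min(z, -1.0)

def prox_g(z):  # projection onto {x >= 1}
    return max(z, 1.0)

z = 0.0
diffs = []
for _ in range(50):
    x = prox_f(z)
    y = prox_g(2 * x - z)
    z_next = z + (y - x)
    diffs.append(z_next - z)
    z = z_next

# z has diverged, while diffs[-1] has settled near 2.0, the distance
# between the two sets -- a certificate that no feasible point exists.
```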
Faster Convergence Rates of Relaxed Peaceman-Rachford and ADMM Under Regularity Assumptions
Splitting schemes are a class of powerful algorithms that solve complicated monotone inclusion and convex optimization problems that are built from many simpler pieces. They give rise to algorithms in which the simple pieces of the decomposition are processed individually. This leads to easily implementable and highly parallelizable algorithms, which often obtain nearly state-of-the-art perform...
On the Step Size of Symmetric Alternating Directions Method of Multipliers
The alternating direction method of multipliers (ADMM) is an application of the Douglas-Rachford splitting method, and the symmetric version of ADMM, which updates the Lagrange multiplier twice at each iteration, is an application of the Peaceman-Rachford splitting method. The symmetric ADMM sometimes works well empirically, but its convergence is not guaranteed in general. It was recently found th...
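The twice-per-iteration multiplier update mentioned above can be sketched as follows on a toy scalar problem. The dampening factors t1 = t2 = 0.9 and the penalty rho = 1 are illustrative assumptions; characterizing the admissible step-size region is precisely the topic of the cited paper.

```python
# Symmetric ADMM sketch for minimize f(x) + g(z) subject to x = z, with
#   f(x) = 0.5*(x - a)^2 and g(z) = indicator of {z >= 0}  (toy problem).
# Unlike standard ADMM, the multiplier u is updated twice per iteration,
# each time dampened by a factor (here 0.9, an illustrative choice).

def symmetric_admm(a, rho=1.0, t1=0.9, t2=0.9, iters=100):
    z, u = 0.0, 0.0
    for _ in range(iters):
        # x-update: minimize 0.5*(x - a)^2 + (rho/2)*(x - z + u)^2
        x = (a + rho * (z - u)) / (1.0 + rho)
        # first (intermediate) multiplier update, dampened by t1
        u = u + t1 * (x - z)
        # z-update: projection of x + u onto {z >= 0}
        z = max(x + u, 0.0)
        # second multiplier update, dampened by t2
        u = u + t2 * (x - z)
    return x, z

# Solution of the toy problem is max(a, 0); here max(1.5, 0) = 1.5.
x, z = symmetric_admm(1.5)
```

Setting t1 = 0 and t2 = 1 recovers the standard ADMM update, while t1 = t2 = 1 gives the Peaceman-Rachford form whose convergence is not guaranteed in general.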
Local Convergence Properties of Douglas-Rachford and ADMM
The Douglas–Rachford (DR) and alternating direction method of multipliers (ADMM) are two proximal splitting algorithms designed to minimize the sum of two proper lower semi-continuous convex functions whose proximity operators are easy to compute. The goal of this work is to understand the local linear convergence behaviour of DR/ADMM when the involved functions are moreover partly smooth. More...
Activity Identification and Local Linear Convergence of Douglas-Rachford/ADMM under Partial Smoothness
Convex optimization has become ubiquitous in most quantitative disciplines of science, including variational image processing. Proximal splitting algorithms are becoming popular to solve such structured convex optimization problems. Within this class of algorithms, Douglas– Rachford (DR) and ADMM are designed to minimize the sum of two proper lower semi-continuous convex functions whose proximi...
Journal:
Volume, issue:
Pages: -
Publication date: 2014